It will be up to the two special counsels to investigate and weigh the handling of secret documents by President Biden and former President Donald Trump. But the current questions should not obscure an enormous problem that has been festering for decades and threatens national security, democracy and accountability: The classification system for managing secrets is overwhelmed and desperately needs repair.
Too much national security information is classified, and too little declassified. For years, officials have stamped documents “secret” in a lowest-common-denominator system that did not penalize over-classification and made declassification difficult and time-consuming. For example, in November, a 2004 interview that President George W. Bush and Vice President Dick Cheney gave to the 9/11 Commission was released to the public. It should not have taken 18 years.
A House Republican overseeing national security put it this way: “The United States today attempts to shield an immense and growing body of secrets using an incomprehensibly complex system of classifications and safeguard requirements. As a result, no one can say with any degree of certainty how much is classified, how much needs to be declassified, or whether the nation’s real secrets can be adequately protected in a system so bloated, it often does not distinguish between the critically important and the comically irrelevant. This much we know: There are too many secrets.” That was Rep. Christopher Shays (R-Conn.) speaking 18 years ago, and the situation is worse today.
Over-classification is counterproductive, making it harder for agencies to function, draining budgets and eroding public confidence. Agencies put their best people to work on the most urgent problems, and declassification is a low priority. Now comes a “tsunami,” as the Public Interest Declassification Board warned two years ago: an explosion of digital information. Yet management of classified materials “largely follows established analog and paper-based models.” The board suggested in a blog post that the Freedom of Information Act backlog for records at the George W. Bush Presidential Library “will take a generation to process. This is not acceptable in our democracy.”
A good start would be to simplify the classification process into two tiers, “secret” and “top secret,” eliminating the lower “confidential” level while protecting those secrets that need special handling. At the same time, the federal board outlined a vision for a modernized classification system that would use the tools of big data, artificial intelligence, machine learning, and cloud storage and retrieval. The idea of automation gives some people pause, but increasingly it makes good sense; the mountain of data is already unmanageable.
A panel of government experts met recently at the Hudson Institute and recommended using more technology to assist human decisions for classification and declassification, noting that “the growing volume of classified records already exceeds the ability of humans alone to process them.”
That’s a wake-up call. The whole system needs to be fixed, and its dysfunction should not be ignored for another decade.